Russ Allbery: Review: An Informal History of the Hugos
Publisher: Tor
Copyright: August 2018
ISBN: 1-4668-6573-3
Format: Kindle
Pages: 564
The grammar is defined in parser.peg. Here is a simplified rule:

```
ConditionIPExpr "condition on IP" ←
  column:("ExporterAddress"i { return "ExporterAddress", nil }
        / "SrcAddr"i { return "SrcAddr", nil }
        / "DstAddr"i { return "DstAddr", nil }) _
  operator:("=" / "!=") _
  ip:IP {
    return fmt.Sprintf("%s %s IPv6StringToNum(%s)",
      toString(column), toString(operator), quote(ip)), nil
  }
```
This rule is named ConditionIPExpr. It case-insensitively matches ExporterAddress, SrcAddr, or DstAddr. The action for each case returns the proper case for the column name. That's what is stored in the column variable.
Then, it matches one of the possible operators. As there is no code block, it stores the matched string directly in the operator variable. Then, it tries to match the IP rule, which is defined elsewhere in the grammar. If it succeeds, it stores the result of the match in the ip variable and executes the final action. The action turns the column, operator, and IP into a proper expression for ClickHouse. For example, if we have ExporterAddress = 203.0.113.15, we get ExporterAddress = IPv6StringToNum('203.0.113.15').
The IP rule uses a rudimentary regular expression but checks if the matched address is correct in the action block, thanks to netip.ParseAddr():

```
IP "IP address" ← [0-9A-Fa-f:.]+ {
  ip, err := netip.ParseAddr(string(c.text))
  if err != nil {
    return "", errors.New("expecting an IP address")
  }
  return ip.String(), nil
}
```
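The validation done in that action block can be tried in isolation. Here is a minimal sketch using only Go's standard net/netip package (the helper name normalize is mine; the rest of the grammar is not needed):

```go
package main

import (
	"fmt"
	"net/netip"
)

// normalize mimics the action block of the IP rule: it takes the text
// matched by the regular expression and either returns the canonical
// form of the address or rejects it.
func normalize(candidate string) (string, error) {
	ip, err := netip.ParseAddr(candidate)
	if err != nil {
		return "", fmt.Errorf("expecting an IP address")
	}
	return ip.String(), nil
}

func main() {
	for _, s := range []string{"203.0.113.15", "2A01:E0F:FFFF::", "203.0.113.999"} {
		if out, err := normalize(s); err == nil {
			fmt.Printf("%s is valid, canonical form %s\n", s, out)
		} else {
			fmt.Printf("%s is invalid: %v\n", s, err)
		}
	}
}
```

Note that netip.ParseAddr() also canonicalizes the address (lowercase hex, compressed zero runs), so the same IPv6 address always compiles to the same ClickHouse expression.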
The parser turns a filter into a WHERE clause accepted by ClickHouse:

```sql
WHERE InIfBoundary = 'external'
  AND ExporterRegion = 'france'
  AND InIfConnectivity = 'transit'
  AND SrcAS = 15169
  AND DstAddr BETWEEN toIPv6('2a01:e0f:ffff::')
                  AND toIPv6('2a01:e0f:ffff:ffff:ffff:ffff:ffff:ffff')
```
The input field, InputFilter, uses CodeMirror as its foundation and leverages features such as syntax highlighting, linting, and completion. The source code for these capabilities can be found in the codemirror/lang-filter/ directory.
```
@top Filter { expression }

expression {
  Not expression |
  "(" expression ")" |
  "(" expression ")" And expression |
  "(" expression ")" Or expression |
  comparisonExpression And expression |
  comparisonExpression Or expression |
  comparisonExpression
}
comparisonExpression {
  Column Operator Value
}
Value {
  String | Literal | ValueLParen ListOfValues ValueRParen
}
ListOfValues {
  ListOfValues ValueComma (String | Literal) |
  String | Literal
}

// […]

@tokens {
  // […]
  Column { std.asciiLetter (std.asciiLetter | std.digit)* }
  Operator { $[a-zA-Z!=><]+ }
  String {
    '"' (![\\\n"] | "\\" _)* '"'? |
    "'" (![\\\n'] | "\\" _)* "'"?
  }
  Literal { (std.digit | std.asciiLetter | $[.:/])+ }
  // […]
}
```
For example, SrcAS = 12322 AND (DstAS = 1299 OR SrcAS = 29447) is parsed to:

```
Filter(Column, Operator, Value(Literal),
       And, Column, Operator, Value(Literal),
       Or, Column, Operator, Value(Literal))
```
```javascript
export const FilterLanguage = LRLanguage.define({
  parser: parser.configure({
    props: [
      styleTags({
        Column: t.propertyName,
        String: t.string,
        Literal: t.literal,
        LineComment: t.lineComment,
        BlockComment: t.blockComment,
        Or: t.logicOperator,
        And: t.logicOperator,
        Not: t.logicOperator,
        Operator: t.compareOperator,
        "( )": t.paren,
      }),
    ],
  }),
});
```
The /api/v0/console/filter/validate endpoint accepts a filter and returns a JSON structure with the errors that were found:

```json
{
  "message": "at line 1, position 12: string literal not terminated",
  "errors": [{
    "line": 1,
    "column": 12,
    "offset": 11,
    "message": "string literal not terminated"
  }]
}
```
Completion relies on the /api/v0/console/filter/complete endpoint.
Walking the syntax tree was not as easy as I thought, but unit tests helped
a lot.
The backend uses the parser generated by pigeon to complete a column name
or a comparison operator. For values, the completions are either static or
extracted from the ClickHouse database. A user can complete an AS number from
an organization name thanks to the following snippet:
```go
results := []struct {
	Label  string `ch:"label"`
	Detail string `ch:"detail"`
}{}
columnName := "DstAS"
sqlQuery := fmt.Sprintf(`
 SELECT concat('AS', toString(%s)) AS label,
        dictGet('asns', 'name', %s) AS detail
 FROM flows
 WHERE TimeReceived > date_sub(minute, 1, now())
 AND detail != ''
 AND positionCaseInsensitive(detail, $1) >= 1
 GROUP BY label, detail
 ORDER BY COUNT(*) DESC
 LIMIT 20`, columnName, columnName)
if err := conn.Select(ctx, &results, sqlQuery, input.Prefix); err != nil {
	c.r.Err(err).Msg("unable to query database")
	break
}
for _, result := range results {
	completions = append(completions, filterCompletion{
		Label:  result.Label,
		Detail: result.Detail,
		Quoted: false,
	})
}
```
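The positionCaseInsensitive(detail, $1) >= 1 filter in the query above is simply a case-insensitive substring match. A small sketch of the same predicate in plain Go (the function name and the sample data are mine, for illustration only):

```go
package main

import (
	"fmt"
	"strings"
)

// matchesInput reports whether a completion detail contains the user
// input, ignoring case, like positionCaseInsensitive(detail, $1) >= 1
// does in ClickHouse.
func matchesInput(detail, input string) bool {
	return strings.Contains(strings.ToLower(detail), strings.ToLower(input))
}

func main() {
	// Hypothetical organization names, standing in for dictGet('asns', 'name', …).
	details := []string{"GOOGLE", "Cloudflare, Inc.", "OVH SAS"}
	for _, d := range details {
		if matchesInput(d, "goo") {
			fmt.Println("candidate:", d)
		}
	}
}
```

The real ranking (most frequent AS numbers first) is then left to the GROUP BY / ORDER BY COUNT(*) in the SQL query.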
Barbie No, seriously! If anyone can make a good film about a doll franchise, it's probably Greta Gerwig. Not only was Little Women (2019) more than admirable, the same could definitely be said for Lady Bird (2017). More importantly, I can't help but feel she was the real 'Driver' behind Frances Ha (2012), one of the better modern takes on Claudia Weill's revelatory Girlfriends (1978). Still, whenever I remember that Barbie will be a film about a billion-dollar toy and media franchise with a nettlesome history, I recall I rubbished the "Facebook film" that turned into The Social Network (2010). Anyway, the trailer for Barbie is worth watching, if only because it seems like a parody of itself.
Blitz It's difficult to overstate just how crucial the aerial bombing of London during World War II is to understanding the British psyche, despite it being a constructed phenomenon from the outset. Without wishing to underplay the deaths of over 40,000 civilians, Angus Calder pointed out in the 1990s that the modern mythology surrounding the event "did not evolve spontaneously; it was a propaganda construct directed as much at [then neutral] American opinion as at British." It will therefore be interesting to see how British Grenadian-Trinidadian director Steve McQueen addresses a topic so essential to the British self-conception. (Remember the controversy in right-wing circles about the sole Indian soldier in Christopher Nolan's Dunkirk (2017)?) McQueen is perhaps best known for his 12 Years a Slave (2013), but he recently directed a six-part film anthology for the BBC which addressed the realities of post-Empire immigration to Britain, and this leads me to suspect he sees the Blitz and its surrounding mythology with a more critical perspective. But any attempt to complicate the story of World War II will be vigorously opposed in a way that will make the recent hullabaloo surrounding The Crown seem tame. All this is to say that the discourse surrounding this release may be as interesting as the film itself.
Dune, Part II Coming out of the cinema after the first part of Denis Villeneuve's adaptation of Dune (2021), I was struck by the impression that it was less of a fresh adaptation of the 1965 novel by Frank Herbert than an attempt to rehabilitate David Lynch's 1984 version, and, in a broader sense, it was also an attempt to reestablish the primacy of cinema over streaming TV and the myriad of other distractions in our lives. I must admit I'm not a huge fan of the original novel, finding within it a certain prurience regarding hereditary military regimes and writing about them with a certain sense of glee that belies a secret admiration for them... not to mention an eyebrow-raising allegory for the Middle East. Still, Dune, Part II is going to be a fantastic spectacle.
Ferrari It'll be curious to see how this differs substantially from the recent Ford v Ferrari (2019), but given that Michael Mann's Heat (1995) so effectively re-energised the gangster/heist genre, I'm more than willing to kick the tires of this film about the founder of the eponymous car manufacturer. I'm in the minority for preferring Mann's Thief (1981) over Heat, in part because the former deals in more abstract themes, so I'd have perhaps preferred to look forward to a more conceptual film from Mann over a story about one specific guy.
How Do You Live There are a few directors one can look forward to watching almost without qualification, and Hayao Miyazaki (My Neighbor Totoro, Kiki's Delivery Service, Princess Mononoke, Howl's Moving Castle, etc.) is one of them. And this is especially so given that The Wind Rises (2013) was meant to be the last collaboration between Miyazaki and Studio Ghibli. Let's hope he is able to come out of retirement in another ten years.
Indiana Jones and the Dial of Destiny Given I had a strong dislike of Indiana Jones and the Kingdom of the Crystal Skull (2008), I seriously doubt I will enjoy anything this film has to show me, but with 1981's Raiders of the Lost Ark remaining one of my most treasured films (read my brief homage), I still feel a strong sense of obligation towards the Indiana Jones name, despite it feeling like the copper is being pulled out of the walls of this franchise today.
Kafka I only know Polish filmmaker Agnieszka Holland through her Spoor (2017), an adaptation of Olga Tokarczuk's 2009 eco-crime novel Drive Your Plow Over the Bones of the Dead. I wasn't an unqualified fan of Spoor (nor the book on which it is based), but I am interested in Holland's take on the life of Czech author Franz Kafka, an author enmeshed with twentieth-century art and philosophy, especially that of central Europe. Holland has mentioned she intends to tell the story "as a kind of collage," and I can hope that it is an adventurous take on the over-furrowed biopic genre. Or perhaps Gregor Samsa will awake from uneasy dreams to find himself transformed in his bed into a huge verminous biopic.
The Killer It'll be interesting to see what path David Fincher is taking today, especially after his puzzling and strangely cold Mank (2020) portraying the writing process behind Orson Welles' Citizen Kane (1941). The Killer is said to be a straight-to-Netflix thriller based on the graphic novel about a hired assassin, which makes me think of Fincher's Zodiac (2007), and, of course, Se7en (1995). I'm not as entranced by Fincher as I used to be, but any film with Michael Fassbender and Tilda Swinton (with a score by Trent Reznor) is always going to get my attention.
Killers of the Flower Moon In Killers of the Flower Moon, Martin Scorsese directs an adaptation of a book about the FBI's investigation into a conspiracy to murder Osage tribe members in the early years of the twentieth century in order to deprive them of their oil-rich land. (The only thing more quintessentially American than apple pie is a conspiracy combined with a genocide.) Separate from learning more about this disquieting chapter of American history, I'd love to discover what attracted Scorsese to this particular story: he's one of the few top-level directors who have the ability to lucidly articulate their intentions and motivations.
Napoleon It often strikes me that, despite all of his achievements and fame, it's somehow still possible to claim that Ridley Scott is relatively underrated compared to other directors working at the top level today. Besides that, though, I'm especially interested in this film, not least of all because I just read Tolstoy's War and Peace (read my recent review) and am working my way through the mind-boggling 431-minute Soviet TV adaptation, but also because several auteur filmmakers (including Stanley Kubrick) have tried to make a Napoleon epic and failed.
Oppenheimer In a way, a biopic about the scientist responsible for the atomic bomb and the Manhattan Project seems almost perfect material for Christopher Nolan. He can certainly rely on stars to queue up to be in his movies (Robert Downey Jr., Matt Damon, Kenneth Branagh, etc.), but whilst I'm certain it will be entertaining on many fronts, I fear it will fall into the well-established Nolan mould of yet another single man struggling with obsession, deception and guilt who is trying in vain to balance order and chaos in the world.
The Way of the Wind Marked by philosophical and spiritual overtones, all of Terrence Malick's films are perfumed with themes of transcendence, nature and the inevitable conflict between instinct and reason. My particular favourite is his stunning Days of Heaven (1978), but The Thin Red Line (1998) and A Hidden Life (2019) also touched me in ways difficult to relate, and are among the few films about the Second World War that don't touch off my sensitivity about them (see my remarks about Blitz above). It is therefore somewhat Malickian that his next film will be a biblical drama about the life of Jesus. Given Malick's filmography, I suspect this will be far more subdued than William Wyler's 1959 Ben-Hur and significantly more equivocal in its conviction compared to Pier Paolo Pasolini's ardently progressive The Gospel According to St. Matthew (1964). However, little beyond that can be guessed, and the film may not even appear until 2024 or even 2025.
Zone of Interest I was mesmerised by Jonathan Glazer's Under the Skin (2013), and there is much to admire in his borderline 'revisionist gangster' film Sexy Beast (2000), so I will definitely be on the lookout for this one. The only thing making me hesitate is that Zone of Interest is based on a book by Martin Amis about a romance set inside the Auschwitz concentration camp. I haven't read the book, but Amis has something of a history in his grappling with the history of the twentieth century, and he seems to do it in a way that never sits right with me. But if Paul Verhoeven's Starship Troopers (1997) proves anything at all, it's all in the adaptation.
kubeadm install.
After passing the two corresponding certifications, my opinion on cloud operators is that they are very much a step back in the direction of proprietary software. You can rebuild their cloud stack with open-source components, but it is also a lot of integration work, similar to using Linux From Scratch instead of a distribution like Debian. A good middle ground are the OpenShift and OKD Kubernetes distributions, which integrate the most common cloud components but allow an installation on your own hardware or on a cloud provider of your choice.
AWS | Azure | OpenShift | OpenShift upstream project |
---|---|---|---|
Cloud Trail | | Kubernetes API Server audit log | Kubernetes |
Cloud Watch | Azure Monitor, Azure Log Analytics | OpenShift Monitoring | Prometheus, Kubernetes Metrics |
AWS Artifact | | Compliance Operator | OpenSCAP |
AWS Trusted Advisor | Azure Advisor | Insights | |
AWS Marketplace | | Red Hat Marketplace | Operator Hub |
AWS Identity and Access Management (IAM) | Azure Active Directory, Azure AD DS | Red Hat SSO | Keycloak |
AWS Elastic Beanstalk | Azure App Services | OpenShift Source2Image (S2I) | Source2Image (S2I) |
AWS S3 | Azure Blob Storage** | ODF Rados Gateway | Rook RGW |
AWS Elastic Block Storage | Azure Disk Storage | ODF Rados Block Device | Rook RBD |
AWS Elastic File System | Azure Files | ODF Ceph FS | Rook CephFS |
AWS ELB Classic | Azure Load Balancer | MetalLB Operator | MetalLB |
AWS ELB Application Load Balancer | Azure Application Gateway | OpenShift Router | HAProxy |
Amazon Simple Notification Service | | OpenShift Streams for Apache Kafka | Apache Kafka |
Amazon Guard Duty | Microsoft Defender for Cloud | API Server audit log review, ACS Runtime detection | Stackrox |
Amazon Inspector | Microsoft Defender for Cloud | Quay.io container scanner, ACS Vulnerability Assessment | Clair, Stackrox |
AWS Lambda | Azure Serverless | OpenShift Serverless* | Knative |
AWS Key Management System | Azure Key Vault | could be done with Hashicorp Vault | Vault |
AWS WAF | | NGINX Ingress Controller Operator with ModSecurity | NGINX ModSecurity |
Amazon Elasticache | | Redis Enterprise Operator | Redis, memcached as alternative |
AWS Relational Database Service | Azure SQL | Crunchy Data Operator | PostgreSQL |
| Azure Arc | OpenShift ACM | Open Cluster Management |
AWS Scaling Group | Azure Scale Set | OpenShift Autoscaler | OKD Autoscaler |
* OpenShift Serverless requires the application to be packaged as a container, something AWS Lambda does not require.

** Azure Blob Storage covers the object storage use case of S3, but is itself not S3-compatible.
Series: Edinburgh Nights #1
Publisher: Tor
Copyright: 2021
Printing: 2022
ISBN: 1-250-76777-6
Format: Kindle
Pages: 329
Publisher: Penguin Books
Copyright: 2005
ISBN: 1-4406-2476-3
Format: Kindle
Pages: 835
On one thing, however, all were agreed, resisters and politicians alike: "planning". The disasters of the inter-war decades (the missed opportunities after 1918, the great depression that followed the stock-market crash of 1929, the waste of unemployment, the inequalities, injustices and inefficiencies of laissez-faire capitalism that had led so many into authoritarian temptation, the brazen indifference of an arrogant ruling elite and the incompetence of an inadequate political class) all seemed to be connected by the utter failure to organize society better. If democracy was to work, if it was to recover its appeal, it would have to be planned.

It's one thing to be familiar with the basic economic and political arguments between degrees of free market and planned economies. It's quite another to understand how the appeal of one approach or the discredit of another stems from recent historical experience, and that's what a good history can provide. Judt does not hesitate to draw these sorts of conclusions, and I'm sure some of them are controversial. But while he's opinionated, he's rarely ideological, and he offers no grand explanations. His discussion of the Yugoslav Wars stands out as an example: he mentions various theories of blame (a fraught local ethnic history, the decision by others to not intervene until the situation was truly dire), but largely discards them. Judt's offered explanation is that local politicians saw an opportunity to gain power by inflaming ethnic animosity, and a large portion of the population participated in this process, either passively or eagerly. Other explanations are both unnecessarily complex and too willing to deprive Yugoslavs of agency. I found this refreshingly blunt. When is more complex analysis a way to diffuse responsibility and cling to an ideological fantasy that the right foreign policy would have resolved a problem?
A few personal grumblings do creep in, particularly in the chapters on the 1970s (and I think it's not a coincidence that this matches Judt's own young adulthood, a time when one is prone to forming a lot of opinions). There is a brief but stinging criticism of postmodernism in scholarship, which I thought was justified but probably incomplete, and a decidedly grumpy dismissal of punk music, which I thought was less fair. But these are brief asides that don't detract from the overall work. Indeed, they, along with the occasional wry asides ("respecting long-established European practice, no one asked the Poles for their views [on Poland's new frontiers]"), add a lot of character.

Insofar as this book has a thesis, it's in the implications of the title: Europe only exited the postwar period at the end of the 20th century. Political stability through exhaustion, the overwhelming urgency of economic recovery, and the degree to which the Iron Curtain and the Cold War froze eastern Europe in amber meant that full European recovery from World War II was drawn out and at times suspended. It's only after 1989 and its subsequent upheavals that European politics were able to move beyond postwar concerns. Some of that movement was a reemergence of earlier European politics of nations and ethnic conflict. But, new on the scene, was a sense of identity as Europeans, one that western Europe circled warily and eastern Europe saw as the only realistic path forward.
What binds Europeans together, even when they are deeply critical of some aspect or other of its practical workings, is what it has become conventional to call, in disjunctive but revealing contrast with "the American way of life", the "European model of society".

Judt also gave me a new appreciation of how traumatic people find the assignment of fault, and how difficult it is to wrestle with guilt without providing open invitations to political backlash. People will go to great lengths to not feel guilty, and pressing the point runs a substantial risk of creating popular support for ideological movements that are willing to lie to their followers. The book's most memorable treatment of this observation is in the epilogue, which traces popular European attitudes towards the history of the Holocaust through the whole time period.

The largest problem with this book is that it is dense and very long. I'm a fairly fast reader, but this was the only book I read through most of my holiday vacation and it still took a full week into the new year to finish it. By the end, I admit I was somewhat exhausted and ready to be finished with European history for a while (although the epilogue is very much worth waiting for). If you, unlike me, can read a book slowly among other things, that may be a good tactic. But despite feeling like this was a slog at times, I'm very glad that I read it. I'm not sure if someone with a firmer grounding in European history would have gotten as much out of it, but I, at least, needed something this comprehensive to wrap my mind around the timeline and fill in some embarrassing gaps. Judt is not the most entertaining writer (although he has his moments), and this is not the sort of popular history that goes out of its way to draw you in, but I found it approachable and clear. If you're looking for a solid survey of modern European history with this type of high-level focus, recommended.

Rating: 8 out of 10
The Great War and Modern Memory (1975)
Wartime: Understanding and Behavior in the Second World War (1989)
Paul Fussell
Rather than describe the battles, weapons, geopolitics or big personalities of the two World Wars, Paul Fussell's The Great War and Modern Memory & Wartime are focused instead on how the two wars have been remembered by their everyday participants. Drawing on the memoirs and memories of soldiers and civilians along with a brief comparison with the actual events that shaped them, Fussell's two books are compassionate, insightful and moving pieces of analysis.
Fussell primarily sets himself against the admixture of nostalgia and trauma that obscures the origins and unimaginable experience of participating in these wars; two wars that were, in his view, a "perceptual and rhetorical scandal from which total recovery is unlikely." He takes particular aim at the dishonesty of hindsight:
For the past fifty years, the Allied war has been sanitised and romanticised almost beyond recognition by the sentimental, the loony patriotic, the ignorant and the bloodthirsty. I have tried to balance the scales. [And] in unbombed America especially, the meaning of the war [seems] inaccessible.

The author does not engage in any of the customary rose-tinted view of war, yet he remains understanding and compassionate towards those who try to locate a reason within what was quite often senseless barbarism. If anything, his despondency and pessimism about the Second World War (the war that Fussell himself fought in) shines through quite acutely, and this is especially the case in what he chooses to quote from others:
"It was common […] throughout the [Okinawa] campaign for replacements to get hit before we even knew their names. They came up confused, frightened, and hopeful, got wounded or killed, and went right back to the rear on the route by which they had come, shocked, bleeding, or stiff. They were forlorn figures coming up to the meat grinder and going right back out of it like homeless waifs, unknown and faceless to us, like unread books on a shelf."

It would take a rather heartless reader to fail to be sobered by this final simile, and an even colder one to view Fussell's citation of such an emotive anecdote to be manipulative. Still, stories and cruel ironies like this one infuse this often-angry book, but it is not without astute and shrewd analysis as well, especially on the many qualitative differences between the two conflicts that simply cannot be captured by facts and figures alone. For example:
A measure of the psychological distance of the Second [World] War from the First is the rarity, in 1914–1918, of drinking and drunkenness poems.

Indeed so. In fact, what makes Fussell's project so compelling and perhaps even unique is that he uses these non-quantitative measures to try and take stock of what happened. After all, this was a war conducted by humans, not the abstract school of statistics. And what is the value of a list of armaments destroyed by such-and-such a regiment when compared with truly consequential insights into both how the war affected, say, the psychology of postwar literature ("Prolonged trench warfare, whether enacted or remembered, fosters paranoid melodrama, which I take to be a primary mode in modern writing."), the specific words adopted by combatants ("It is a truism of military propaganda that monosyllabic enemies are easier to despise than others") as well as the very grammar of interaction:
The Field Service Post Card [in WW1] has the honour of being the first widespread exemplary of that kind of document which uniquely characterises the modern world: the "Form". [And] as the first widely known example of dehumanised, automated communication, the post card popularised a mode of rhetoric indispensable to the conduct of later wars fought by great faceless conscripted armies.

And this wouldn't be a book review without argument-ending observations that:
Indicative of the German wartime conception [of victory] would be Hitler and Speer's elaborate plans for the ultimate reconstruction of Berlin, which made no provision for a library.

Our myths about the two world wars possess an undisputed power, in part because they contain an essential truth: the atrocities committed by Germany and its allies were not merely extreme or revolting, but their full dimensions (embodied in the Holocaust and the Holodomor) remain essentially inaccessible within our current ideological framework. Yet the two wars are better understood as an abyss in which we were all dragged into the depths of moral depravity, rather than a battle pitched by the forces of light against the forces of darkness. Fussell is one of the few observers that can truly accept and understand this truth and is still able to speak to us cogently on the topic from the vantage point of experience. The Second World War, which looms so large in our contemporary understanding of the modern world (see below), may have been necessary and unavoidable, but Fussell convinces his reader that it was morally complicated "beyond the power of any literary or philosophic analysis to suggest," and that the only way to maintain a naïve belief in the myth that these wars were a Manichaean fight between good and evil is to overlook reality. There are many texts on the two World Wars that can either stir the intellect or move the emotions, but Fussell's two books do both. A uniquely perceptive and intelligent commentary; outstanding.
Longitude (1995)
Dava Sobel

Since Man first decided to sail the oceans, knowing one's location has always been critical. Yet doing so reliably used to be a serious problem: if you didn't know where you were, you were far more likely to die and/or lose your valuable cargo. But whilst finding one's latitude (i.e. your north-south position) had effectively been solved by the beginning of the 17th century, finding one's (east-west) longitude was far from trustworthy in comparison. This book, first published in 1995, is therefore something of an anachronism. As in, we readily use the GPS facilities of our phones today without hesitation, so we find it difficult to imagine a reality in which knowing something fundamental like your own location is essentially unthinkable.

It became clear in the 18th century, though, that in order to accurately determine one's longitude, what you actually needed was an accurate clock. In Longitude, therefore, we read of the remarkable story of John Harrison and his quest to create a timepiece that would not only keep time during a long sea voyage but would survive the rough ocean conditions as well. Self-educated and a carpenter by trade, Harrison made a number of important breakthroughs in keeping accurate time at sea, and Longitude describes his novel breakthroughs in a way that is both engaging and without talking down to the reader. Still, this book covers much more than that, including the development of accurate longitude going hand-in-hand with advancements in cartography as well as in scientific experiments to determine the speed of light: experiments that led to the formulation of quantum mechanics. It also outlines the work being done by Harrison's competitors. 'Competitors' is indeed the correct word here, as Parliament offered a huge prize to whoever could create such a device, and the ramifications of this tremendous financial incentive are an essential part of this story.
For the most part, though, Longitude sticks to the story of Harrison and his evolving obsession with creating the perfect timepiece. Indeed, one reason that Longitude is so resonant with readers is that many of the tropes of the archetypical 'English inventor' are embedded within Harrison himself. That is to say, here is a self-made man pushing against the establishment of the time, with his groundbreaking ideas being underappreciated in his life, or dishonestly purloined by his intellectual inferiors. At the level of allegory, then, I am minded to interpret this portrait of Harrison as a symbolic distillation of postwar Britain: a nation acutely embarrassed by the loss of the Empire that is now repositioning itself as a resourceful but plucky underdog; a country that, with a combination of the brains of boffins and a healthy dose of charisma and PR, can still keep up with the big boys. (It is this same search for postimperial meaning I find in the fiction of John le Carré, and, far more famously, in the James Bond franchise.) All of this is left to the reader, of course, as what makes Longitude singularly compelling is its gentle manner and tone. Indeed, at times it was as if the doyenne of sci-fi Ursula K. Le Guin had a sideline in popular non-fiction. I realise it's a mark of critical distinction to downgrade the importance of popular science in favour of erudite academic texts, but Longitude is ample evidence that so-called 'pop' science need not be patronising or reductive at all.
Closed Chambers: The Rise, Fall, and Future of the Modern Supreme Court (1998)
Edward Lazarus

After the landmark decision by the U.S. Supreme Court in Dobbs v. Jackson Women's Health Organization that ended the Constitutional right to abortion conferred by Roe v. Wade, I prioritised a few books in the queue about the judicial branch of the United States. One of these books was Closed Chambers, which attempts to assay, according to its subtitle, "The Rise, Fall and Future of the Modern Supreme Court".

This book is not simply a learned guide to the history and functioning of the Court (although it is completely creditable in this respect); it's actually an 'insider' view of the workings of the institution, as Lazarus was a clerk for Justice Harry Blackmun during the October term of 1988. Lazarus has therefore combined his experience as a clerk and his personal reflections (along with a substantial body of subsequent research) in order to communicate the collapse in comity between the Justices. Part of this book is therefore a pure history of the Court, detailing its important nineteenth-century judgements (such as Dred Scott, which ruled that the Constitution did not consider Blacks to be citizens; and Plessy v. Ferguson, which failed to find protection in the Constitution against racial segregation laws), as well as many twentieth-century cases that touch on the rather technical principle of substantive due process. Other layers of Lazarus's book are explicitly opinionated, however, and they capture the author's assessment of the Court's actions in the past and present [1998] day. Given the role in which he served at the Court, particular attention is given by Lazarus to the function of its clerks. These are revealed as being far more than the mere amanuenses they were hitherto believed to be. Indeed, the book is potentially unique in its claim that the clerks have played a pivotal role in the deliberations, machinations and eventual rulings of the Court.
By implication, then, the clerks have played a crucial role in the internal controversies that surround many of the high-profile Supreme Court decisions; decisions that, to the outsider at least, are presented as disinterested interpretations of the Constitution of the United States. This is of especial importance given that, to Lazarus, "for all the attention we now pay to it, the Court remains shrouded in confusion and misunderstanding." Throughout his book, Lazarus complicates the commonplace view that the Court is divided into two simple right vs. left political factions, and instead documents an ever-evolving series of loosely held but strongly felt cabals, quid pro quo exchanges, outright equivocation and pure personal prejudices. (The age and concomitant illnesses of the Justices also appear to have a not insignificant effect on the Court's rulings.) In other words, Closed Chambers is not a book that will be read in a typical civics class in America, and the only time the book resorts to the customary breathless rhetoric about the US federal government is in its opening chapter:
The Court itself, a Greek-style temple commanding the crest of Capitol Hill, loomed above them in the dim light of the storm. Set atop a broad marble plaza and thirty-six steps, the Court stands in splendid isolation appropriate to its place at the pinnacle of the national judiciary, one of the three independent and "coequal" branches of American government. Once dubbed the Ivory Tower by architecture critics, the Court has a Corinthian colonnade and massive twenty-foot-high bronze doors that guard the single most powerful judicial institution in the Western world. Lights still shone in several offices to the right of the Court's entrance, and [...]
Et cetera, et cetera. But, of course, this encomium to the inherent 'nobility' of the Supreme Court is quickly revealed to be a narrative foil, as Lazarus soon razes this dangerously naïve conception to the ground:
[The] institution is [now] broken into unyielding factions that have largely given up on a meaningful exchange of their respective views or, for that matter, a meaningful explication or defense of their own views. It is of Justices who in many important cases resort to transparently deceitful and hypocritical arguments and factual distortions as they discard judicial philosophy and consistent interpretation in favor of bottom-line results. This is a Court so badly splintered, yet so intent on lawmaking, that shifting 5-4 majorities, or even mere pluralities, rewrite whole swaths of constitutional law on the authority of a single, often idiosyncratic vote. It is also a Court where Justices yield great and excessive power to immature, ideologically driven clerks, who in turn use that power to manipulate their bosses and the institution they ostensibly serve.
Lazarus does not put forward a single, overarching thesis, but in the final chapters, he does suggest a potential future for the Court:
In the short run, the cure for what ails the Court lies solely with the Justices. It is their duty, under the shield of life tenure, to recognize the pathologies affecting their work and to restore the vitality of American constitutionalism. Ultimately, though, the long-term health of the Court depends on our own resolve on whom [we] select to join that institution.
Back in 1998, Lazarus might have had room for this qualified optimism. But from the vantage point of 2022, it appears that the "resolve" of the United States citizenry was not muscular enough to meet his challenge. After all, Lazarus was writing before Bush v. Gore in 2000, which arrogated to the judicial branch the ability to decide a presidential election; before the disillusionment of the blocked nomination of Barack Obama's replacement for Scalia; and before many other missteps at the Court as well. All of these have now been compounded by the Trump administration's appointment of three Republican-friendly justices to the Court, including the hypocritical appointment of Justice Barrett a mere 38 days before the 2020 election. And, of course, the leaking and ruling in Dobbs v. Jackson, the true extent of which has not yet been felt. None of this is Lazarus' fault, of course, but the Court's recent decisions (as well as the liberal hagiographies of 'RBG') must perforce affect one's reading of the concluding chapters. The other slight defect of Closed Chambers is that, whilst it often implies the importance of the federal and state courts within the judiciary, it only briefly positions the Supreme Court's decisions in relation to what was happening in the House, Senate and White House at the time. This seems increasingly relevant as time goes on: after all, it seems fairly clear even to this Brit that relying on an activist Supreme Court to enact progressive laws must be interpreted as a failure of the legislative branch to overcome the perennial problems of the filibuster, culture wars and partisan bickering.
Nevertheless, Lazarus' book is in equal parts ambitious, opinionated, scholarly and, dare I admit it, wonderfully gossipy. By juxtaposing history, memoir, and analysis, Closed Chambers combines an exacting evaluation of the Court's decisions with a lively portrait of the intellectual and emotional intensity that has grown within the Supreme Court's pseudo-monastic environment, all while it struggles with the most impactful legal issues of the day. This book is an excellent and well-written achievement that will likely never be repeated, and a must-read for anyone interested in this increasingly important branch of the US government.
Crashed: How a Decade of Financial Crises Changed the World (2018)
Shutdown: How Covid Shook the World's Economy (2021)
Adam Tooze
The economic historian Adam Tooze has often been labelled as an unlikely celebrity, but in the fourteen years since the global financial crisis of 2008, a growing audience has been looking for answers about the various failures of the modern economy. Tooze, a professor of history at New York's Columbia University, has written much that is penetrative and thought-provoking on this topic, and as a result, he has generated something of a cult following amongst economists, historians and the online left.
I actually read two Tooze books this year. The first, Crashed (2018), catalogues the scale of government intervention required to prop up global finance after the 2008 financial crisis, and it characterises the different ways that countries around the world failed to live up to the situation, such as doing far too little, or taking action far too late. The connections between the high-risk subprime loans, credit default swaps and the resulting liquidity crisis in the US in late 2008 are fairly well known today, in part thanks to films such as Adam McKay's 2015 The Big Short and much improved economic literacy in media reportage. But Crashed makes the implicit claim that, whilst the specific and structural origins of the 2008 crisis are worth scrutinising in exacting detail, it is the reaction of states in the months and years after the crash that has been overlooked.
After all, this is a reaction that has not only shaped a new economic order, it has created one that does not fit any conventional idea about the way the world 'ought' to be run. Tooze connects the original American banking crisis to the (multiple) European debt crises and to a larger crisis of liberalism. Indeed, Tooze somehow manages to cover all these topics and more, weaving in Trump, Brexit and Russia's 2014 annexation of Crimea, as well as the evolving role of China in the post-2008 economic order.
Where Crashed focused on the constellation of consequences that followed the events of 2008, Shutdown is a clear and comprehensive account of the way the world responded to the economic impact of Covid-19. The figures are often jaw-dropping: soon after the disease spread around the world, 95% of the world's economies contracted simultaneously, and at one point, the global economy shrank by approximately 20%. Tooze's keen and sobering analysis of what happened is made all the more remarkable by the fact that it came out whilst the pandemic was still unfolding. In fact, this leads quickly to one of the book's few flaws: by being published so quickly, Shutdown prematurely over-praises China's 'zero Covid' policy, and these remarks will make a reader today squirm in their chair. Still, despite the regularity of these references (after all, mentioning China is very useful when one is directly comparing economic figures in early 2021, for example), these are actually minor blemishes on the book's overall thesis.
That is to say, Shutdown is not merely a retelling of what happened in such-and-such a country during the pandemic; it offers in effect a prediction about what might be coming next. Whilst the economic responses to Covid averted what could easily have been another Great Depression (and thus showed that the world had learned some lessons from 2008), they did so only by truly discarding the economic rule book. The by-product of inverting this set of written and unwritten conventions that have governed the world for the past 50 years, this 'Washington consensus' if you will, has yet to be fully felt.
Of course, there are many parallels between these two books by Tooze. Both the liquidity crisis outlined in Crashed and the economic response to Covid in Shutdown exposed the fact that one of the central tenets of the modern economy (i.e. that financial markets can be trusted to regulate themselves) was entirely untrue, and was likely false from the very beginning. And whilst Adam Tooze does not offer a singular piercing insight (conveying a sense of rigorous mastery instead), he may as well be asking whether we're simply going to lurch along from one crisis to the next, relying on the technocrats in power to fix problems when everything blows up again. The answer may very well be yes.
Looking for the Good War: American Amnesia and the Violent Pursuit of Happiness (2021) Elizabeth D. Samet Elizabeth D. Samet's Looking for the Good War answers the following question: what would be the result if you asked a professor of English to disentangle the complex mythology we have about WW2 in the context of the recent US exit from Afghanistan? Samet's book acts as a twenty-first-century update of a kind to Paul Fussell's two books (reviewed above), as well as a deeper meditation on the idea that each new war is seen through the lens of the previous one. Indeed, like The Great War and Modern Memory (1975) and Wartime (1989), Samet's book is a perceptive work of demystification, but whilst Fussell seems to have been inspired by his own traumatic war experience, Samet is not only informed by her teaching West Point military cadets but also by the physical and ontological wars that have occurred during her own life. A more scholarly and dispassionate text is the result of Samet's relative distance from armed combat, but it doesn't mean Looking for the Good War lacks energy or inspiration. Samet shares John Adams' belief that no political project can entirely shed the innate corruptions of power and ambition, and so it is crucial to analyse and re-analyse the role of WW2 in contemporary American life. She is surely correct that the Second World War has been universally elevated as a special, 'good' war. Even those with exceptionally giddy minds seem to treat WW2 as hallowed:
It is nevertheless telling that one of the few occasions to which Trump responded with any kind of restraint while he was in office was the 75th anniversary of D-Day in 2019.
What is the source of this restraint, and what has nurtured its growth in the eight decades since WW2 began? Samet posits several reasons for this, including the fact that almost all of the media about the Second World War is not only suffused with symbolism and nostalgia but, less obviously, it has been made by people who have no experience of the events that they depict. Take Stephen Ambrose, author of Steven Spielberg's Band of Brothers miniseries: "I was 10 years old when the war ended," Samet quotes of Ambrose. "I thought the returning veterans were giants who had saved the world from barbarism. I still think so. I remain a hero worshiper." If Looking for the Good War has a primary thesis, then, it is that childhood hero worship is no basis for a system of government, let alone a crusading foreign policy. There is a straight line (to quote this book's subtitle) from the "American Amnesia" that obscures the reality of war to the "Violent Pursuit of Happiness." Samet's book doesn't merely provide a modern appendix to Fussell's two works, however, as it adds further layers and dimensions he overlooked. For example, Samet provides some excellent insight on the role of Western, gangster and superhero movies, and she is especially good when looking at noir films as a kind of kaleidoscopic response to the Second World War:
Noir is a world ruled by bad decisions but also by bad timing. Chance, which plays such a pivotal role in war, bleeds into this world, too.
Samet rightfully weaves the role of women into the narrative as well. Women in film noir are often celebrated as 'independent' and sassy, correctly reflecting their newly-found independence gained during WW2. But these 'liberated' roles are not exactly a ringing endorsement of this independence: the 'femme fatale' and the 'tart', etc., reflect a kind of conditional freedom permitted to women by a post-War culture which is still wedded to an outmoded honour culture. In effect, far from being novel and subversive, these roles for women actually underwrote the ambient cultural disapproval of women's presence in the workforce. Samet later connects this highly-conditional independence with the liberation of Afghan women, which:
is inarguably one of the more palatable outcomes of our invasion, and the protection of women's rights has been invoked on the right and the left as an argument for staying the course in Afghanistan. How easily consequence is becoming justification. How flattering it will be one day to reimagine it as original objective.
Samet has ensured her book has a predominantly US angle as well, for she ends her book with a chapter on the pseudohistorical Lost Cause of the Civil War. The legacy of the Civil War is still visible in the physical phenomena of Confederate statues, but it also exists in deep-rooted racial injustice that has been shrouded in euphemism and other psychological devices for over 150 years. Samet believes that a key part of what drives the American mythology about the Second World War is the way in which it subconsciously cleanses the horrors of brother-on-brother murder that were seen in the Civil War. This is a book that is not only of interest to historians of the Second World War; it is a work for anyone who wishes to understand almost any American historical event, social issue, politician or movie that has appeared since the end of WW2. That is, for better or worse, everyone on earth.
Series: | Trang #2 |
Publisher: | Mary Sisson |
Copyright: | 2012 |
Printing: | December 2013 |
ASIN: | B0087KQDQ0 |
Format: | Kindle |
Pages: | 375 |
VecDeque, which is great. If you actually need more than VecDeque can do, use one of the handful of libraries that actually offer a significantly more useful API. If you are writing your own data structure, check if someone has done it already, and consider slotmap or generational-arena (or maybe Rc/Arc).
generational_token_list, which makes a plausible alternative to dlv-list, which I already recommended in 2019.)
Why are there so many poor Rust linked list libraries?
Linked lists and Rust do not go well together. But (and I'm guessing here) I presume many people are taught in programming school that a linked list is a fundamental data structure; people are often even asked to write one as a teaching exercise. This is a bad idea in Rust. Or maybe they've heard that writing linked lists in Rust is hard and want to prove they can do it.
Double-ended queues
One of the main applications for a linked list in a language like C is a queue, where you put items in at one end, and take them out at the other. The Rust standard library has a data structure for that: VecDeque.
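A minimal sketch of VecDeque used as exactly such a queue, pushing items in at one end and taking them out at the other (the double-ended nature comes for free):

```rust
use std::collections::VecDeque;

fn main() {
    // A FIFO queue: push at the back, pop from the front.
    let mut queue: VecDeque<&str> = VecDeque::new();
    queue.push_back("first");
    queue.push_back("second");
    queue.push_front("urgent"); // double-ended: the front works too

    assert_eq!(queue.pop_front(), Some("urgent"));
    assert_eq!(queue.pop_front(), Some("first"));
    assert_eq!(queue.pop_back(), Some("second"));
    assert!(queue.is_empty());
}
```

Under the hood this is a growable ring buffer, so both ends are amortised O(1), with none of the per-node allocation a linked list would do.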
Five of the available libraries offer an API along the lines of VecDeque: basically, pushing and popping elements at the front and back. If that is all you need, simply use VecDeque, because VecDeque is in the Rust Standard Library; there is no good reason to take on an extra dependency when you can use VecDeque instead.
The Cursor concept
A proper linked list lets you identify and hold onto an element in the middle of the list, and cheaply insert and remove elements there.
Rust's ownership and borrowing rules make this awkward. One idea that people have many times reinvented and reimplemented is to have a Cursor type, derived from the list, which is a reference to an element, and permits insertion and removal there.
Eight libraries have implemented this in the obvious way. However, there is a serious API limitation:
To prevent a cursor being invalidated (e.g. by deletion of the entry it points to) you can't modify the list while the cursor exists. You can only have one cursor (that can be used for modification) at a time.
The practical effect of this is that you cannot retain cursors. You can make and use such a cursor for a particular operation, but you must dispose of it soon. Attempts to do otherwise will see you losing a battle with the borrow checker.
If that's good enough, then you could just use a VecDeque and use array indices instead of the cursors. It's true that deleting or adding elements in the middle involves a lot of copying, but your algorithm is O(n) even with the single-cursor list libraries, because it must first walk the cursor to the desired element.
Formally, I believe any algorithm using these exclusive cursors can be rewritten, in an obvious way, to simply iterate and/or copy from the start or end (as one can do with VecDeque) without changing the headline O() performance characteristics.
IMO the savings available from avoiding extra copies etc. are not worth the additional dependency, unsafe code, and so on, especially as there are other ways of helping with that (e.g. boxing the individual elements).
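To illustrate the index-instead-of-cursor idea, here is a small sketch of my own (not taken from any of the surveyed libraries): the "cursor" is just a plain index into a VecDeque, and insertion and removal in the middle happen through that index.

```rust
use std::collections::VecDeque;

fn main() {
    let mut deque: VecDeque<char> = ['a', 'b', 'd'].into_iter().collect();

    // The "cursor" is simply the index of the element we care about.
    let pos = deque.iter().position(|&c| c == 'd').unwrap();

    // Insert a new element before it, then remove the old one, by index.
    // Both calls shift elements, which is the O(n) copying cost discussed above.
    deque.insert(pos, 'c');
    let removed = deque.remove(pos + 1);

    assert_eq!(removed, Some('d'));
    assert_eq!(deque.iter().collect::<String>(), "abc");
}
```

As argued above, the walk to find `pos` is already O(n), so the shifting done by `insert` and `remove` does not change the headline complexity.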
Even if you don't find that convincing, generational_token_list and dlv_list are strictly superior, since they offer a more flexible and convenient API and better performance, and rely on much less unsafe code.
Rustic approaches to pointers-to-and-between-nodes data structures
Most of the time a VecDeque is great. But if you actually want to hold onto (perhaps many) references to the middle of the list, and later modify it through those references, you do need something more. This is a specific case of a general class of problems where the naive approach (use Rust references to the data structure nodes) doesn't work well.
But there is a good solution:
Keep all the nodes in an array (a Vec<Option<T>> or similar) and use the index in the array as your node reference. This is fast, and quite ergonomic, and neatly solves most of the problems. If you are concerned that bare indices might cause confusion, as newly inserted elements would reuse indices, add a per-index generation count.
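A hypothetical, stripped-down version of this generational-index idea (the real crates are considerably more sophisticated; all names here are mine): node references are (index, generation) pairs, so a stale reference to a freed-and-reused slot is detected instead of silently pointing at the new occupant.

```rust
// A minimal generational arena sketch: slots hold a generation counter,
// and a Handle is only valid while its generation matches the slot's.
struct Slot<T> { gen: u64, val: Option<T> }

struct Arena<T> { slots: Vec<Slot<T>> }

#[derive(Clone, Copy, PartialEq, Debug)]
struct Handle { index: usize, gen: u64 }

impl<T> Arena<T> {
    fn new() -> Self { Arena { slots: Vec::new() } }

    fn insert(&mut self, val: T) -> Handle {
        // Reuse a free slot if possible, bumping its generation.
        for (i, s) in self.slots.iter_mut().enumerate() {
            if s.val.is_none() {
                s.gen += 1;
                s.val = Some(val);
                return Handle { index: i, gen: s.gen };
            }
        }
        self.slots.push(Slot { gen: 0, val: Some(val) });
        Handle { index: self.slots.len() - 1, gen: 0 }
    }

    fn get(&self, h: Handle) -> Option<&T> {
        self.slots.get(h.index)
            .filter(|s| s.gen == h.gen)
            .and_then(|s| s.val.as_ref())
    }

    fn remove(&mut self, h: Handle) -> Option<T> {
        let s = self.slots.get_mut(h.index)?;
        if s.gen != h.gen { return None; }
        s.val.take()
    }
}

fn main() {
    let mut arena = Arena::new();
    let a = arena.insert("alpha");
    arena.remove(a);
    let b = arena.insert("beta"); // reuses slot 0 with a new generation
    assert_eq!(arena.get(a), None); // the stale handle is rejected
    assert_eq!(arena.get(b), Some(&"beta"));
}
```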
These approaches have been neatly packaged up in libraries like slab, slotmap, generational-arena and thunderdome. And they have been nicely applied to linked lists by the authors of generational_token_list and dlv-list.
The alternative for nodey data structures in safe Rust: Rc/Arc
Of course, you can just use Rust's interior mutability and reference counting smart pointers to directly implement the data structure of your choice.
In many applications, a single-threaded data structure is fine, in which case Rc and Cell/RefCell will let you write safe code, with cheap refcount updates and runtime checks inserted to defend against unexpected aliasing, use-after-free, etc.
I took this approach in rc-dlist-deque, because I wanted each node to be able to be on multiple lists.
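A minimal sketch of the Rc/RefCell approach (my own illustration, not code from rc-dlist-deque): a single node shared between two "lists", where Rc provides the shared ownership and RefCell the dynamically-checked mutability.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Each list is just a Vec of reference-counted handles; a node can sit
// on several lists at once because ownership is shared via Rc.
type Node = Rc<RefCell<i32>>;

fn main() {
    let shared: Node = Rc::new(RefCell::new(1));

    let list_a: Vec<Node> = vec![Rc::clone(&shared)];
    let list_b: Vec<Node> = vec![Rc::clone(&shared)];

    // Mutating through one list is visible through the other;
    // borrow_mut() panics at runtime if the node is already borrowed.
    *list_a[0].borrow_mut() += 41;

    assert_eq!(*list_b[0].borrow(), 42);
    assert_eq!(Rc::strong_count(&shared), 3); // `shared` plus both lists
}
```

The cost is a refcount bump per handle and a runtime borrow check per access, which is exactly the trade-off described above.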
Rust's package ecosystem demonstrating software's NIH problem
The Rust ecosystem is full of NIH libraries of all kinds. In my survey, there are: five good options; seven libraries which are plausible, but just not as good as the alternatives; and fourteen others.
There is a whole rant I could have about how the whole software and computing community is pathologically neophilic. Often we seem to actively resist reusing ideas, let alone code, and are ignorant and dismissive of what has gone before. As a result, we keep solving the same problems, badly, making the same mistakes over and over again. In some subfields, working software, or nearly working software, is frequently replaced with something worse, maybe more than once.
One aspect of this is a massive cultural bias towards rewriting rather than reusing, let alone fixing and using.
Many people can come out of a degree, trained to be a programmer, with no formal training in selecting and evaluating software, even though working effectively with computers requires making good use of everyone else's work.
If one isn't taught these skills (when and how to search for prior art, how to choose between dependencies, and so on) one must learn them on the job. The result is usually an ad-hoc and unsystematic approach, often dominated by fashion rather than engineering.
The package naming paradox
The more experienced and competent programmer is aware of all the other options that exist: after all, they have evaluated other choices before writing their own library.
So they will call their library something like generational_token_list or vecdeque-stableix.
Whereas the novice straight out of a pre-Rust programming course just thinks what they are doing is the one and only obvious thing (even though it's a poor idea) and hasn't even searched for a previous implementation. So they call their package something obvious, like 'linked list'.
As a result, the most obvious names seem to refer to the least useful libraries.
\a in a C escape sequence) would come in the input stream.
That feature actually predates computers altogether, and was present
in Baudot code, "an early character encoding for telegraphy
invented by Émile Baudot in the 1870s", itself superseding Morse
code.
Modern terminal emulators have, of course, kept that feature: if
you run this command in a terminal right now:
printf '\a'
... you may hear some annoying beep. Or not. It actually depends on
a lot of factors, including which terminal emulator you're using, how
it's configured, whether you have headphones on, or speakers
connected, or, if you're really old school, a PC speaker, even.
I have this theory that it typically does the exact opposite of what you want, regardless of how you have configured it. That is, if you want it to make noise, it won't, and if you want it to stay silent, it will make brutal, annoying noises at the moments you would least expect. I suspect this is a law of computer science, but I'm too lazy to come up with a formal definition.
Yet something can be done with this.
~/.Xresources):
XTerm*bellIsUrgent: true
XTerm*visualBell: false
Interestingly, this doesn't clearly say "bell is muted", but that's effectively what it does. Or maybe it works because I have muted "System Sounds" in Pulseaudio. Who knows. I do have this in my startup scripts though:
xset b off
... which, according to the xset(1) manpage, means
If the dash or 'off' are given, the bell will be turned off.
Interestingly, you have the option of setting the bell "volume", "pitch, in hertz, and [...] duration in milliseconds. Note that not all hardware can vary the bell characteristics." In any case, I think that's the magic trick to turn the darn thing off. Now this should send urgency hints to your window manager:
sleep 3 ; printf '\a'
Try it: run the command, switch to another desktop, then wait 3
seconds. You should see the previous desktop show up in red or
something.
In the i3 window manager I am currently using, this is the default, although I did set the colors (client.urgent and urgent_workspace in bar.colors). Other window managers or desktop environments may require different configurations.
set beep=yes
set beep_new=yes
/set beep_when_window_active ON
/set beep_when_away ON
/set beep_msg_level MSGS DCC DCCMSGS HILIGHT
It was also recommending this setting, but it appears to be deprecated
and gives a warning in modern irssi versions:
/set bell_beeps ON
# disabled: we want to propagate bell to clients, which should handle
# it in their own terminal settings. this vbell off is also the
# upstream and tmux's default
#
# see also: http://netbuz.org/blog/2011/11/x-bells-and-urgency-hints/
vbell off
# propagate bell from other windows up to the terminal emulator as well
bell_msg 'Bell in window %n^G'
The bell_msg bit is an extra from me: it uses the bell message that pops up when screen detects a bell in another window to resend the bell control character up to the running terminal. This makes it so a bell in any multiplexed window will also propagate to the parent terminal, which is not the default.
# listen to alerts from all windows
set -g bell-action any
# notice bell in windows
set -g monitor-bell on
# only propagate bell, don't warn user, as it hangs tmux for a second
set -g visual-bell off
# send bell *and* notify when activity (if monitor-activity)
set -g visual-activity both
Note that this config goes beyond what we have in GNU screen in that inactivity or activity will trigger a bell as well. This might be useful for cases where you don't have the prompt hack (below) but it could also very well be very noisy. It will only generate noise when monitor-activity is enabled, though.
PROMPT_COMMAND in bash, it would conflict with my (already existing and way too) complex bash prompt, and lead to odd behaviors. It would also not work for remote commands, of course, as it wouldn't have access to my local D-BUS to send notifications (thankfully!).
So instead, what I do now is systematically print a bell whenever a
command terminates, in all my shells. I have this in my /root/.bashrc on all my servers, deployed in Puppet:
PROMPT_COMMAND='printf "\a"'
Or you can just put it directly in the shell prompt, with something
like:
PS1='\[\a\]'"$PS1"
(I have the equivalent in my own .bashrc, although that thing is much more complex, featuring multi-command pipeline exit status, colors, terminal title setting, and more, which should probably warrant its own blog post.)
This sounds a little bonkers and really noisy, but remember that I
turned off the audible bell. And urgency hints are going to show up
only if the window is unfocused. So it's actually really nice and not
really distracting.
Or, to reuse the undistract-me concept: it allows me to not lose focus
too much when I'm waiting for a long process to complete.
That idea actually came from ahf, so kudos to him on that nice
hack.
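A hypothetical refinement of the same idea, closer to what undistract-me does: only ring the bell when the finished command ran longer than some threshold. This is my own sketch, not ahf's hack; all the `_bell_*` names are made up, and it relies on bash's DEBUG trap, PROMPT_COMMAND, and SECONDS variable.

```shell
#!/bin/bash
# Sketch: ring the terminal bell only after commands that ran longer
# than a threshold, undistract-me style. All names here are hypothetical.
_bell_threshold=10  # seconds

# The DEBUG trap fires before each command; record the start time once.
_bell_timer_start() {
    _bell_cmd_start=${_bell_cmd_start:-$SECONDS}
}
trap '_bell_timer_start' DEBUG

# Run from PROMPT_COMMAND: beep only if the last command was slow.
_bell_maybe() {
    local elapsed=$(( SECONDS - ${_bell_cmd_start:-$SECONDS} ))
    unset _bell_cmd_start
    if [ "$elapsed" -ge "$_bell_threshold" ]; then
        printf '\a'
    fi
}
PROMPT_COMMAND=_bell_maybe
```

With this, quick commands stay silent, and only long-running ones trigger the urgency hint when they finish.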
printf '\a' to get a notification.
This might not work in Wayland, your window manager, your desktop
environment, your Linux console, or your telegraphy session.
Publisher: | Riverhead Books |
Copyright: | 2021 |
ISBN: | 0-698-40513-7 |
Format: | Kindle |
Pages: | 260 |
module | score | downloads | release | stars | watch | forks | license | docs | contrib | issue | PR | notes |
---|---|---|---|---|---|---|---|---|---|---|---|---|
halyard | 3.1 | 1,807 | 2022-10-14 | 0 | 0 | 0 | MIT | no | | | | requires firewall and Configvault_Write modules? |
voxpupuli | 5.0 | 4,201 | 2022-10-01 | 2 | 23 | 7 | AGPLv3 | good | 1/9 | 1/4 | 1/61 | optionally configures ferm, uses systemd-networkd, recommends systemd module with manage_systemd set to true, purges unknown keys |
abaranov | 4.7 | 17,017 | 2021-08-20 | 9 | 3 | 38 | MIT | okay | 1/17 | 4/7 | 4/28 | requires pre-generated private keys |
arrnorets | 3.1 | 16,646 | 2020-12-28 | 1 | 2 | 1 | Apache-2 | okay | 1 | 0 | 0 | requires pre-generated private keys? |
wireguard::interface { 'wg0':
  source_addresses => ['2003:4f8:c17:4cf::1', '149.9.255.4'],
  public_key       => $facts['wireguard_pubkeys']['nodeB'],
  endpoint         => 'nodeB.example.com:53668',
  addresses        => [{'Address' => '192.168.123.6/30',}, {'Address' => 'fe80::beef:1/64'},],
}
This configuration comes from this pull request I sent to the module to document how to use that fact.
Note that the addresses used here are examples that shouldn't be
reused and do not conform to RFC5737 ("IPv4 Address Blocks
Reserved for Documentation", 192.0.2.0/24 (TEST-NET-1),
198.51.100.0/24 (TEST-NET-2), and 203.0.113.0/24 (TEST-NET-3)) or
RFC3849 ("IPv6 Address Prefix Reserved for Documentation",
2001:DB8::/32), but that's another story.
(To avoid bootstrapping problems, the resubmit-facts configuration could be used so that other nodes' facts are more immediately available.)
One problem with the above approach is that you explicitly need to
take care of routing, network topology, and addressing. This can get
complicated quickly, especially if you have lots of devices, behind
NAT, in multiple locations (which is basically my life at home,
unfortunately).
Concretely, basic Wireguard only supports one peer behind NAT. There are some workarounds for this, but they generally imply a relay server of some sort, or some custom registry; it's kind of a mess. And this is where overlay networks like Tailscale come in.
curl | bash, but they also provide packages for various platforms. Their Debian install instructions are surprisingly good, and check most of the third party checklist we're trying to establish. (It's missing a pin.)
There's also a Puppet module for tailscale, naturally.
What I find a little disturbing with Tailscale is that you not only need to trust Tailscale with authorizing your devices, you also basically delegate that trust to the SSO provider. So, in my case, GitHub (or anyone who compromises my account there) can penetrate the VPN. A little scary.
Tailscale is also kind of an "all or nothing" thing. They have MagicDNS, file transfers, all sorts of things, but those things require you to hook up your resolver with Tailscale. In fact, Tailscale kind of assumes you will use their nameservers, and has gone to great lengths to figure out how to do that. And naturally, here, it doesn't seem to work reliably; my resolv.conf somehow gets replaced and the magic resolution of the ts.net domain fails.
(I wonder why we can't opt in to just publicly resolve the ts.net domain. I don't care if someone can enumerate the private IP addresses or machines in use in my VPN, at least I don't care as much as fighting with resolv.conf everywhere.)
Because I mostly have access to the routers on the networks I'm on, I
don't think I'll be using tailscale in the long term. But it's pretty
impressive stuff: in the time it took me to even review the Puppet
modules to configure Wireguard (which is what I'll probably end up
doing), I was up and running with Tailscale (but with a broken DNS,
naturally).
(And yes, basic Wireguard won't bring me DNS either, but at least I
won't have to trust Tailscale's Debian packages, and Tailscale, and
Microsoft, and GitHub with this thing.)
--- ipv6.l.google.com ping statistics ---
10 packets transmitted, 10 received, 0,00% packet loss, time 136,8ms
RTT[ms]: min = 13, median = 14, p(90) = 14, max = 15
--- google.com ping statistics ---
10 packets transmitted, 10 received, 0,00% packet loss, time 136,0ms
RTT[ms]: min = 13, median = 13, p(90) = 14, max = 14
In the case of GitHub, latency is actually lower, interestingly:
--- ipv6.github.com ping statistics ---
10 packets transmitted, 10 received, 0,00% packet loss, time 134,6ms
RTT[ms]: min = 13, median = 13, p(90) = 14, max = 14
--- github.com ping statistics ---
10 packets transmitted, 10 received, 0,00% packet loss, time 293,1ms
RTT[ms]: min = 29, median = 29, p(90) = 29, max = 30
That is because HE.net peers directly with my ISP and with Fastly
(which is apparently behind github.com's IPv6), so it's only 6 hops
away. Over IPv4, the ping goes through New York before landing in
AWS's Ashburn, Virginia datacenters, for a whopping 13 hops...
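Summary lines like the ones above are easy to recompute from raw RTT samples. Here is a minimal Python sketch; the sample values are invented to roughly match the IPv6 run, and the nearest-rank method for p(90) is an assumption about how the ping tool computes it:

```python
import math
import statistics

def ping_summary(rtts_ms: list) -> dict:
    """Summarize a list of round-trip times the way ping-style tools do."""
    ordered = sorted(rtts_ms)
    # Nearest-rank percentile: the sample below which 90% of values fall.
    p90 = ordered[math.ceil(0.9 * len(ordered)) - 1]
    return {
        "min": min(ordered),
        "median": statistics.median(ordered),
        "p90": p90,
        "max": max(ordered),
    }

# Ten invented samples roughly matching the IPv6 run above.
print(ping_summary([13, 13, 13, 13, 14, 14, 14, 14, 14, 15]))
# → {'min': 13, 'median': 14.0, 'p90': 14, 'max': 15}
```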
I managed to set up an HE.net tunnel at home, because I also need IPv6
for other reasons (namely debugging at work). My first attempt at
setting this up in the office failed, but now that I found the
openwrt.org guide, it worked... for a while, and I was able to
produce the above, encouraging, mini benchmarks.
Unfortunately, a few minutes later, IPv6 just went down again. And the
problem with that is that many programs (especially OpenSSH) do not
implement the Happy Eyeballs protocol (RFC 8305), which means
mysterious "hangs" at random times in random applications. It's kind
of a terrible user experience, on top of breaking the one thing it's
supposed to do, of course, which is to give me transparent access to
all the nodes I maintain.
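What RFC 8305 does to avoid those hangs is race the address families with a short stagger instead of waiting for the first family to time out. A toy simulation of the idea (addresses and timings invented; a real client would perform actual connect() calls instead of sleeping):

```python
import asyncio

# Simulated "connect": the given delay is how long until it succeeds.
async def fake_connect(addr: str, delay: float) -> str:
    await asyncio.sleep(delay)
    return addr

async def happy_eyeballs(addrs, stagger: float = 0.25) -> str:
    """Start one attempt per address, staggered, return the first winner."""
    async def attempt(i, addr, delay):
        await asyncio.sleep(i * stagger)  # RFC 8305 connection-attempt delay
        return await fake_connect(addr, delay)

    tasks = [asyncio.create_task(attempt(i, a, d))
             for i, (a, d) in enumerate(addrs)]
    done, pending = await asyncio.wait(tasks,
                                       return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()  # drop the slower attempts
    return next(iter(done)).result()

# IPv6 is listed first but "hangs" (tunnel down); IPv4 is tried 250 ms
# later and answers immediately, so the caller never sees the hang.
winner = asyncio.run(happy_eyeballs([("2001:db8::1", 10.0),
                                     ("203.0.113.1", 0.0)]))
print(winner)  # → 203.0.113.1
```

An application that waits on the IPv6 attempt alone, OpenSSH-style, would instead block for the full TCP timeout before falling back.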
Even worse, it would still be a problem for other remote nodes I might
set up where I might not have access to the router to set up the
tunnel. It's also not entirely clear what happens if you set up the
same tunnel in two places... Presumably, something is smart enough to
distribute only part of the /48 block selectively, but I don't
really feel like going that far, considering how flaky the setup is
already.
destroy table
and delete table
should be defined consistently, with
the following meanings:
-m nft xyz. This feels ugly, but may work. We also explored playing with the semantics of
release version numbers, and another idea: storing strings in the nft rule userdata area with the equivalent
matching information for older iptables-nft.
In fact, what Phil may have been looking for is not backwards but forward compatibility. Phil was undecided which path
to follow, but perhaps the most common-sense approach is to fall back to a major release version bump (2.x.y)
and declare compatibility breakage with older iptables 1.x.y.
That was pretty much it for the first day. We had dinner together and got some sleep before the next day.
The second day was opened by Florian Westphal (Netfilter coreteam member and Red Hat engineer). Florian has been
trying to improve nftables performance in kernels with RETPOLINE mitigations enabled. He commented that several
workarounds have been collected over the years to avoid the performance penalty of such mitigations.
The basic strategy is to avoid function indirect calls in the kernel.
Florian also described how BPF programs work around this more effectively. He actually tried translating
nf_hook_slow() to BPF. Some preliminary benchmark results were shown, with about a 2% performance improvement in
MB/s and PPS. The flowtable infrastructure benefits especially from this approach. The software
flowtable infrastructure already offers a 5x performance improvement over the classic forwarding path, and the
change being researched by Florian would be an addition on top of that.
We then moved on to discussing the meeting Florian had with Alexei in Zurich. My personal opinion is that
Netfilter offers interesting user-facing interfaces and semantics that BPF does not, whereas BPF may be more performant
in certain scenarios. The idea of both going hand in hand may feel natural to some people. Others also
shared my view, but no particular agreement was reached on this topic. Florian will probably continue exploring options
on that front.
The next topic was opened by Fernando. He wanted to discuss Netfilter involvement in Google Summer of Code and Outreachy.
Pablo had some personal stuff going on last year that prevented him from engaging in such projects. After all, GSoC is
not fundamental or a priority for Netfilter. Also, Pablo mentioned the lack of support from others in the project for
mentoring activities. There was no particular decision made here. Netfilter may be present again in such initiatives
in the future, perhaps under the umbrella of other organizations.
Again, Fernando proposed the next topic: nftables JSON support. Fernando shared his plan of going over all features
and introducing programmatic tests for them. He also mentioned that the nftables wiki was incomplete and couldn't be
used as a reference for missing tests. Phil suggested running the nftables Python test-suite in JSON mode, which
should complain about missing features. The py test suite should cover pretty much all statements and variations on
how the nftables expressions are invoked.
Next, Phil commented on nftables xtables support, that is, supporting legacy xtables extensions in nftables.
The most prominent problem was that some translations had corner cases that resulted in a listed ruleset that
couldn't be fed back into the kernel. Also, iptables-to-nftables translations can be sloppy, and the resulting
rule won't work in some cases. In general, nft list ruleset | nft -f
may fail on rulesets created by iptables-nft,
and there is no trivial way to solve this.
Phil also commented on potential iptables-tests.py speed-ups. Running the test suite may take a very long time
depending on the hardware. Phil will try to re-architect it so it runs faster. Some alternatives had been
explored, including collecting all rules into a single iptables-restore run instead of hundreds of individual
iptables calls.
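The batching idea is worth illustrating: instead of forking one iptables process per rule, the test harness can render all rules as a single iptables-restore payload and feed it to one process. A sketch of the payload format (the rule list and helper name are invented for illustration):

```python
def to_restore_format(rules, table="filter"):
    """Render a list of iptables rule specs as one iptables-restore payload.

    iptables-restore reads a *table line, the rules, and a COMMIT marker,
    and applies everything in a single atomic operation.
    """
    lines = ["*%s" % table]
    lines += rules  # e.g. "-A INPUT -p tcp --dport 22 -j ACCEPT"
    lines.append("COMMIT")
    return "\n".join(lines) + "\n"

payload = to_restore_format([
    "-A INPUT -p tcp --dport 22 -j ACCEPT",
    "-A INPUT -j DROP",
])
# One process instead of hundreds:
#   iptables-restore --noflush <<< "$payload"
print(payload, end="")
```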
The next topic was about documentation on the nftables wiki. Phil is interested in having all nftables
code-flows documented, and presented some improvements on that front. We are trying to organize all
developer-oriented docs on a mediawiki portal, but the extension was not active yet. Since I worked at the
Wikimedia Foundation, the whole room stared at me, so in the end I kind of committed to exploring and enabling the
mediawiki portal extension. Note to self: is this perhaps https://www.mediawiki.org/wiki/Portals ?
The next presentation was by Pablo. He had a list of assorted topics for quick review and comment.
One of them was struct constant_expr, which can reduce memory usage by 12.5%. Another was the nft_tunnel
expression, which can do this encapsulation for complete feature parity. It is only available in
the kernel, but it can easily be made available in the userspace utility too.
Also, we discussed some limitations of katran, for example its inability to handle IP fragmentation, IP options, and
potentially other cases not documented anywhere. This seems to be common with XDP/BPF programs, because handling all
possible network scenarios would over-complicate the BPF programs, and at that point you are probably better off
using the normal Linux network stack and nftables.
In summary, we agreed that nftlb can pretty much offer the same as katran, in a more flexible way.
Finally, after many interesting debates over two days, the workshop ended. We all agreed on the need to extend
it to 3 days next time, since 2 days felt too intense and too short for all the topics worth discussing.
That's all on my side! I really enjoyed this Netfilter workshop round.
Series: | The Locked Tomb #3 |
Publisher: | Tordotcom |
Copyright: | 2022 |
ISBN: | 1-250-85412-1 |
Format: | Kindle |
Pages: | 480 |
$ git clone https://gitlab.com/gnutls/gnutls.git guile-gnutls
$ cd guile-gnutls/
$ git checkout f5dcbdb46df52458e3756193c2a23bf558a3ecfd
$ git-filter-repo --path guile/ --path m4/guile.m4 --path doc/gnutls-guile.texi --path doc/extract-guile-c-doc.scm --path doc/cha-copying.texi --path doc/fdl-1.3.texi
I debated with myself back and forth whether to include some files that would be named the same in the new repository but would share little to no similar lines, for example configure.ac and Makefile.am, not to mention README and NEWS. Initially I thought it would be nice to preserve the history for all lines that went into the new project, but this is a subjective judgement call. What brought me over to a more minimal approach was that the contributor history and attribution would be quite strange for the new repository: should Guile-GnuTLS attribute the work of the thousands of commits to configure.ac which had nothing to do with Guile? Should the people who wrote that be mentioned as contributors of Guile-GnuTLS? I think not.
The next step was to get a reasonable GitLab CI/CD pipeline up, to make sure the project builds on some free GNU/Linux distributions like Trisquel and PureOS as well as the usual non-free distributions like Debian and Fedora to have coverage of dpkg and rpm based distributions. I included builds on Alpine and ArchLinux as well, because they tend to trigger other portability issues. I wish there were GNU Guix docker images available for easy testing on that platform as well. The GitLab CI/CD rules for a project like this are fairly simple.
To get things out of the door, I tagged the result as v3.7.9 and published a GitLab release page for Guile-GnuTLS that includes OpenPGP-signed source tarballs, built on my laptop and manually uploaded. The URLs for these tarballs are not very pleasant to work with, and discovering new releases automatically appears unreliable, but I don't know of a better approach.
To finish this project, I have proposed a GnuTLS merge request to remove all Guile-related parts from the GnuTLS core.
Doing some GnuTLS-related work again felt nice; it was quite some time ago, so thank you for giving me this opportunity. Thoughts or comments? Happy hacking!
analyze open source packages for reproducibility. We start with an existing package (for example, the NPM left-pad
package, version 1.3.0), and we try to answer the question: do the package contents authentically reflect the purported source code?
More details can be found in the README.md
file within the code repository.
best-practices-badge
GitHub project. Essentially, though, the claim is that the reproducibility requirement doesn't make sense for projects that do not release built software, and that timestamp differences by themselves don't necessarily indicate malicious changes. Numerous pragmatic problems around excluding timestamps were raised in the discussion of the issue.
[ ] a massive year-over-year increase in cyberattacks aimed at open source project ecosystems. According to early data from Sonatype's 8th annual State of the Software Supply Chain Report, which will be released in full this October, Sonatype has recorded an average 700% jump in repository attacks over the last three years. More information is available in the press release.
/projects/
to /who/
in order to keep old or archived links working [ ], Jelle van der Waa added a Rust programming language example for SOURCE_DATE_EPOCH
[ ][ ] and Mattia Rizzolo included Protocol Labs amongst our project-level sponsors [ ].
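For context, SOURCE_DATE_EPOCH is the reproducible-builds.org convention that lets build tools substitute a fixed, source-derived timestamp for "now". A minimal Python sketch of honouring it (the helper name is ours):

```python
import os
import time

def build_timestamp() -> int:
    """Return SOURCE_DATE_EPOCH if set, otherwise the current time.

    Any timestamp a build embeds (archive entries, generated docs, etc.)
    should use this value so two builds of the same source produce
    byte-identical output.
    """
    epoch = os.environ.get("SOURCE_DATE_EPOCH")
    return int(epoch) if epoch is not None else int(time.time())

os.environ["SOURCE_DATE_EPOCH"] = "1664582400"  # e.g. the last commit date
print(build_timestamp())  # → 1664582400
```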
nfft
source package was removed from the archive, and now all packages in Debian bookworm have a corresponding .buildinfo
file. This can be confirmed and tracked on the associated page on the tests.reproducible-builds.org site.
man-db, asking for an update and whether to remove the patch tag for now. This was subsequently removed and the maintainer started to address the issue.
gmp to DELAYED/15, fixing #1009931.
plymouth, asking for the maintainer's opinion on the patch. This resulted in the maintainer improving Vagrant's original patch (and uploading it) as well as filing an issue upstream.
time to DELAYED/15, fixing #983202.
mylvmbackup (#782318)
libranlip (#788000, #846975 & #1007137)
libranlip to DELAYED/10.
cclive (#824501)
cclive to DELAYED/10.
linuxtv-dvb-apps) and so the bug was marked as done.
clhep).
#debian-reproducible IRC channel.
cmake_rpath_contains_build_path
[ ], nondeterministic_version_generated_by_python_param
[ ] and timestamps_in_documentation_generated_by_org_mode
[ ]. Furthermore, two new issue types were created: build_path_used_to_determine_version_or_package_name
[ ] and captures_build_path_via_cmake_variables
[ ].
222
and 223
to Debian, as well as made the following changes:
cbfstools utility is now provided in Debian via the coreboot-utils package so we can enable that functionality within Debian. [ ]
glibc/seccomp issue that was preventing the Docker-contained diffoscope instance from spawning any external processes whatsoever [ ]. I also updated the requirements.txt file, as some of the specified packages were no longer available [ ][ ].
file version 5.43 [ ] and Mattia Rizzolo updated the packaging:
coreboot-utils in the Build-Depends and Test-Depends fields so that it is available for tests. [ ]
Breaks/Replaces that have been obsoleted since the release of Debian bullseye. [ ]
0.7.22 was uploaded to Debian unstable by Holger Levsen, which included the following changes by Philip Hands:
setarch(8) utility can actually execute before including an architecture to test. [ ]
*.*deb in the default artifact_pattern in order to archive all results of the build. [ ]
head(1)
utility. [ ]
DateTime (fails to build in 2038)
FreeRCT (date-related issue)
clanlib1 (filesystem ordering)
cli (fails to build in 2038)
deepin-gettext-tools (patch + version update toolchain sort python glob)
mariadb (fails to build in 2038)
mercurial (fails to build in 2038)
mirrormagic (parallelism-related issue)
ocaml-extlib (parallelism-related issue)
python-xmlrpc/python-softlayer (fails to build in 2038)
python (fails to build in 2038)
q3rally (zip-related issue)
rnd_jue (parallelism-related issue)
rsync (workaround for an issue in GCC 7.x)
scons (SOURCE_DATE_EPOCH-related issue)
stratagus (date-related issue)
triplane (nondeterminism caused by uninitialised memory)
tyrquake (date-related issue)
gnome-online-accounts.
LANGUAGE environment variable inconsistently affecting the output of objects.inv files.
mp4v2 (date-related issue)
mm-common (uid/gid issue)
aardvark-dns (date-related issue)
extrepo-data, tmpreaper, xmlrpc-epi, pal, nvram-wakeup, netris, netpbm-free, lookup, logtools, libid3tag, log4cpp, libimage-imlib2-perl, jnettop, gwaei, ipfm, tarlz, w3cam, ifstat, xserver-xorg-input-joystick, chibicc, python-omegaconf, snapper, libreswan, pure-ftpd, xcolmix, gigalomania, xjump, waili, sjeng, seqtk, shapetools, rotter, rakarrack, rig, postal, netkit-rsh, libapache-mod-evasive, paxctl, png23d, perl-byacc, poster, powerdebug, aespipe, aewm++-goodies, apache-upload-progress-module, ascii2binary, bible-kjv, dradio, libapache2-mod-python, tempest-for-eliza, aplus-fsf, wrapsrv, uclibc, xppaut, xvier, xserver-xorg-video-glide, z80asm, yaskkserv, edid-decode, dustmite, libapache2-mod-authnz-pam, kafs-client, yaku-ns, bplay, chise-base, checkpw, clamz, libapache2-mod-auth-pgsql.
scp -p in order to preserve modification times when syncing live ISO images. [ ]
scp(1) correctly [ ] and consequently added support to use both scp -p and regular scp [ ].
#reproducible-builds on irc.oftc.net.
rb-general@lists.reproducible-builds.org